Causes of Catastrophic Forgetting in Class-Incremental Semantic Segmentation

Authors

Abstract

Class-incremental learning for semantic segmentation (CiSS) is presently a highly researched field which aims at updating a semantic segmentation model by sequentially learning new classes. A major challenge in CiSS is overcoming the effects of catastrophic forgetting, which describes the sudden drop of accuracy on previously learned classes after the model is trained on a new set of classes. Despite latest advances in mitigating catastrophic forgetting, the underlying causes of forgetting specifically in CiSS are not well understood. Therefore, in a set of experiments and representational analyses, we demonstrate that the semantic shift of the background class and a bias towards the new classes are the major causes of forgetting in CiSS. Furthermore, we show that both causes mostly manifest themselves in the deeper classification layers of the network, while the early layers are not affected. Finally, we demonstrate how both causes can be effectively mitigated by utilizing the information contained in the background, with the help of knowledge distillation and an unbiased cross-entropy loss.
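The abstract names two concrete ingredients, knowledge distillation and an unbiased cross-entropy loss, but gives no implementation details. As a rough illustration only, the PyTorch sketch below shows one common way these two terms are formulated in the CiSS literature (in the spirit of methods such as MiB): the background logit absorbs the old-class probability mass, and the old model's outputs are distilled into the new model. The function names, the channel ordering (0 = background, then old classes, then new classes) and the exact formulation are assumptions, not the authors' code.

import torch
import torch.nn.functional as F

def unbiased_cross_entropy(new_logits, targets, num_old_classes):
    # Unbiased CE sketch: background probability also absorbs the old-class
    # mass, so background pixels are not pushed away from old classes.
    # Ignore-index handling is omitted for brevity.
    log_probs = F.log_softmax(new_logits, dim=1)                     # (B, C, H, W)
    bg_and_old = torch.logsumexp(log_probs[:, :num_old_classes + 1],
                                 dim=1, keepdim=True)                # (B, 1, H, W)
    adjusted = torch.cat([bg_and_old,
                          log_probs[:, num_old_classes + 1:]], dim=1)
    # Remap labels: background (and any old-class label) -> 0,
    # new class c -> c - num_old_classes.
    remapped = targets.clone()
    is_new = targets > num_old_classes
    remapped[~is_new] = 0
    remapped[is_new] = targets[is_new] - num_old_classes
    return F.nll_loss(adjusted, remapped)

def distillation_loss(new_logits, old_logits, T=1.0):
    # KL distillation restricted to the channels the frozen old model predicts.
    num_old = old_logits.shape[1]
    p_old = F.softmax(old_logits / T, dim=1)
    log_p_new = F.log_softmax(new_logits[:, :num_old] / T, dim=1)
    return F.kl_div(log_p_new, p_old, reduction="batchmean") * (T * T)

In training, the two terms would typically be combined as unbiased_cross_entropy(...) plus a weighted distillation_loss(...), with the old model kept frozen to provide old_logits.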


Similar Articles

Overcoming Catastrophic Forgetting by Incremental Moment Matching

Catastrophic forgetting is a problem of neural networks in which the information learned for a first task is lost after training on a second task. Here, we propose a method, incremental moment matching (IMM), to resolve this problem. IMM incrementally matches the moments of the posterior distributions of the neural network trained on the first and the second task, respectively. To make the search...
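The description above is truncated, but the core idea, matching the moments of the per-task posteriors, has a very compact special case: mean-IMM merges the two networks by averaging their weights. The sketch below illustrates only that special case; the function name and the equal mixing ratio are assumptions, and the full IMM method includes further variants (e.g. mode-IMM) not shown here.

import copy
import torch

@torch.no_grad()
def mean_imm(model_task1, model_task2, alpha=0.5):
    # Mean-IMM sketch: merged weights are a convex combination of the
    # per-task weights; alpha is the mixing ratio for task 2 (assumed 0.5).
    merged = copy.deepcopy(model_task2)
    params1 = dict(model_task1.named_parameters())
    for name, p2 in merged.named_parameters():
        p2.copy_((1 - alpha) * params1[name] + alpha * p2)
    return merged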


Catastrophic Forgetting in Connectionist Networks: Causes, Consequences and Solutions

All natural cognitive systems, and, in particular, our own, gradually forget previously learned information. Consequently, plausible models of human cognition should exhibit similar patterns of gradual forgetting of old information as new information is acquired. Only rarely (see Box 3) does new learning in natural cognitive systems completely disrupt or erase previously learned information. In ot...


Catastrophic forgetting in connectionist networks.

All natural cognitive systems, and, in particular, our own, gradually forget previously learned information. Plausible models of human cognition should therefore exhibit similar patterns of gradual forgetting of old information as new information is acquired. Only rarely does new learning in natural cognitive systems completely disrupt or erase previously learned information; that is, natural c...


Catastrophic Forgetting, Rehearsal and Pseudorehearsal

This paper reviews the problem of catastrophic forgetting (the loss or disruption of previously learned information when new information is learned) in neural networks, and explores rehearsal mechanisms (the retraining of some of the previously learned information as the new information is added) as a potential solution. We replicate some of the experiments described by Ratcliff (1990), includi...
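Since the entry above explains rehearsal as retraining on some of the previously learned items while new items are added, a minimal sketch of such a replay store may help. The class name, capacity, and reservoir-sampling policy below are illustrative assumptions, not details from the paper.

import random

class RehearsalBuffer:
    # Fixed-size store of previously seen (input, target) pairs,
    # replayed alongside new data during training to reduce forgetting.
    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.items = []
        self.n_seen = 0

    def add(self, example):
        # Reservoir sampling keeps a uniform sample of everything seen so far.
        self.n_seen += 1
        if len(self.items) < self.capacity:
            self.items.append(example)
        else:
            j = random.randrange(self.n_seen)
            if j < self.capacity:
                self.items[j] = example

    def sample(self, k):
        # Draw a mini-batch of stored examples to mix with the new-task batch.
        return random.sample(self.items, min(k, len(self.items)))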


Measuring Catastrophic Forgetting in Neural Networks

Deep neural networks are used in many state-of-the-art systems for machine perception. Once a network is trained to do a specific task, e.g., bird classification, it cannot easily be trained to do new tasks, e.g., incrementally learning to recognize additional bird species or learning an entirely different task such as flower recognition. When new tasks are added, typical deep neural networks a...



Journal

Journal title: Lecture Notes in Computer Science

Year: 2023

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-031-26293-7_22